# Hybrid SSM Architecture

## PLaMo 2 8B (pfnet)

License: Other

PLaMo 2 8B is an 8-billion-parameter hybrid-architecture language model developed by Preferred Elements, supporting English and Japanese text generation.

Tags: Large Language Model, Transformers, Multiple Languages
## Zamba 7B v1 phase1 (Zyphra)

License: Apache-2.0

Zamba-7B-v1-phase1 is a hybrid architecture that combines the Mamba state space model with a Transformer: Mamba blocks form the backbone, and a single shared Transformer layer (reusing the same weights at each occurrence) is interleaved after every six Mamba blocks. The model was trained via next-token prediction.

Tags: Large Language Model, Transformers
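The interleaving pattern described above (a Mamba backbone with one weight-shared Transformer layer inserted every six blocks) can be sketched as a layer schedule. This is an illustrative sketch only, not Zyphra's implementation; the function name, the block count of 12, and the string labels are all hypothetical:

```python
# Hypothetical sketch of the Zamba-style layer schedule: Mamba blocks form
# the backbone, and one SHARED Transformer layer (same weights each time)
# is inserted after every `period` Mamba blocks.

def zamba_layer_schedule(num_mamba_blocks: int, period: int = 6) -> list[str]:
    """Return the layer sequence as labels; every 'shared_transformer'
    entry refers to the same single set of Transformer weights."""
    layers = []
    for i in range(1, num_mamba_blocks + 1):
        layers.append(f"mamba_{i}")
        if i % period == 0:
            layers.append("shared_transformer")  # weights reused, not a new layer
    return layers

schedule = zamba_layer_schedule(12)
# With 12 Mamba blocks, the shared Transformer appears twice:
# after block 6 and after block 12.
```

Sharing one Transformer layer across all insertion points keeps the parameter count close to a pure-Mamba model while still giving every sixth position access to full attention.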
© 2025 AIbase